Novel six-coordinate Aryl- and Alkyltin complexes
Organotin compounds have wide applications as pesticides and as intermediates for organic synthesis.¹ They are invariably Sn(IV) derivatives and are generally four-coordinate.² The mixed organo/chloro compounds of the type RₙSnCl₄₋ₙ do, however, have the ability to expand their coordination numbers to five or six. This depends critically on the substituents: with four organic groups, R₄Sn, there is no tendency at all to coordinate extra ligands, while at the other extreme SnCl₄ readily forms six-coordinate [SnCl₄L₂] complexes, since the electronegative halo groups increase the Lewis acidity of the tin centre.
Dental caries experience in the Australian adult population
The definitive version is available at www.blackwell-synergy.com. The document attached has been archived with permission from the Australian Dental Association (18 Jan 2007). An external link to the publisher's copy is included.
Sensitivity-analysis method for inverse simulation application
An important criticism of traditional methods of inverse simulation based on the Newton–Raphson algorithm is that they suffer from numerical problems. In this paper these problems are discussed, and a new method based on sensitivity-analysis theory is developed and evaluated. The Jacobian matrix may be calculated by solving a sensitivity equation, and this has advantages over the approximation methods that are usually applied when the derivatives of output variables with respect to inputs cannot be found analytically. The methodology also overcomes problems of input–output redundancy that arise in the traditional approaches to inverse simulation. The sensitivity-analysis approach makes full use of information within the time interval over which key quantities are compared, such as the difference between calculated values and the given ideal maneuver after each integration step. Applications to nonlinear HS125 aircraft and Lynx helicopter models show that this sensitivity-analysis method gives more stable and accurate results than the traditional Newton–Raphson approach.
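The core idea above can be sketched on a toy problem: instead of approximating the Jacobian dy/du by finite differences, the sensitivity equation (obtained by differentiating the state equation with respect to the input) is integrated alongside the state, and its value drives the Newton update. The first-order model, parameter values and target below are illustrative assumptions, not the paper's aircraft or helicopter models.

```python
import numpy as np

# Toy first-order model: dx/dt = -a*x + b*u, output y = x.
# Inverse problem: find the constant input u so that y(T) matches y_target.
a, b = 2.0, 1.0
T, dt = 1.0, 1e-3
y_target = 0.4

def simulate_with_sensitivity(u):
    """Integrate the state x and its sensitivity s = dx/du together.
    Differentiating dx/dt = -a*x + b*u with respect to u gives the
    sensitivity equation ds/dt = -a*s + b."""
    x, s = 0.0, 0.0
    n = int(round(T / dt))
    for _ in range(n):
        x += dt * (-a * x + b * u)
        s += dt * (-a * s + b)
    return x, s

u = 0.0
for _ in range(20):  # Newton iterations using the exact sensitivity
    y, s = simulate_with_sensitivity(u)
    u -= (y - y_target) / s  # s plays the role of the Jacobian dy/du

y_final, _ = simulate_with_sensitivity(u)
print(u, y_final)  # y_final should be close to y_target
```

Because the sensitivity is integrated with the same scheme as the state, the Newton step uses the Jacobian of the discretised map itself, avoiding the step-size tuning that finite-difference Jacobians require.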
Conceptual design of an airborne laser Doppler velocimeter system for studying wind fields associated with severe local storms
An airborne laser Doppler velocimeter was evaluated for diagnostics of the wind field associated with an isolated severe thunderstorm. Two scanning configurations were identified. The first is a long-range (out to 10–20 km), roughly horizontal-plane mode intended to allow probing of the velocity field around the storm at higher altitudes (4–10 km). The second is a shorter-range (out to 1–3 km) mode in which a vertical or horizontal plane is scanned for velocity (and possibly turbulence), intended for diagnostics of the lower-altitude region below the storm and in the outflow region. It was concluded that aircraft flight velocities are high enough, and severe-storm lifetimes long enough, that a single airborne Doppler system operating at a range of less than about 20 km can view the storm area from two or more different aspects before the storm characteristics change appreciably.
Velocity, energy and helicity of vortex knots and unknots
In this paper we determine the velocity and the energy, and estimate the writhe and twist helicity contributions, of vortex filaments in the shape of torus knots and unknots (toroidal and poloidal coils) in a perfect fluid. Calculations are performed by numerical integration of the Biot–Savart law. Vortex complexity is parametrized by the winding number, given by the ratio of the number of meridian wraps to that of the longitudinal wraps. We find that in one regime of the winding number vortex knots and toroidal coils move faster and carry more energy than a reference vortex ring of the same size and circulation, whereas in the complementary regime knots and poloidal coils have approximately the same speed and energy as the reference vortex ring. Helicity is dominated by the writhe contribution. Finally, we confirm the stabilizing effect of the Biot–Savart law for all knots and unknots tested, which are found to be structurally stable over a distance of several diameters. Our results also apply to quantized vortices in superfluid He. (17 pages, 8 figures, 2 tables)
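The numerical integration of the Biot–Savart law over a torus-knot filament can be sketched as below. The parametrization, the core cut-off δ used to desingularize the integrand, and all parameter values are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

# Hedged sketch: velocity induced by a vortex filament in the shape of a
# torus knot T(p, q), via a desingularized Biot-Savart sum over segments.
Gamma = 1.0        # circulation
R, r = 1.0, 0.25   # torus radii
p, q = 2, 3        # longitudinal / meridian wrap numbers (trefoil T(2,3))
delta = 1e-2       # core cut-off to desingularize the integrand
N = 2000           # number of filament segments

t = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
curve = np.stack([(R + r * np.cos(q * t)) * np.cos(p * t),
                  (R + r * np.cos(q * t)) * np.sin(p * t),
                  r * np.sin(q * t)], axis=1)
dl = np.roll(curve, -1, axis=0) - curve  # segment vectors

def biot_savart_velocity(x):
    """u(x) = Gamma/(4 pi) * sum_i dl_i x (x - x_i) / (|x - x_i|^2 + delta^2)^(3/2)"""
    mid = curve + 0.5 * dl       # evaluate at segment midpoints
    rvec = x - mid
    dist2 = np.sum(rvec**2, axis=1) + delta**2
    integrand = np.cross(dl, rvec) / dist2[:, None]**1.5
    return Gamma / (4.0 * np.pi) * integrand.sum(axis=0)

u0 = biot_savart_velocity(np.array([0.0, 0.0, 0.0]))
print(u0)  # induced velocity at the torus centre
```

Evaluating this sum at points on the filament itself (with the cut-off regularizing the singular self-contribution) and stepping the points forward in time is the usual route to the filament velocities discussed above.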
Towards a framework for the integration of information security into undergraduate computing curricula
With the rapid rise of the world's reliance on technology, organisations are facing an increased demand for a security-savvy workforce. It is, therefore, important that computing graduates possess the necessary information security skills, knowledge and understanding to enable them to perform their organisational roles and responsibilities in a secure manner. These skills, knowledge and understanding can be acquired through a computing qualification offered at a higher education institution. The ACM/IEEE, as a key role player that provides educational guidelines for the development of computing curricula, recommends that information security should be pervasively integrated into the curriculum. However, its guidelines and recommendations do not provide sufficient guidance on "how" this can be done. This study therefore proposes a framework to address the pervasive integration of information security into computing curricula. Various research methods were used in this study. Firstly, a literature review was undertaken to inform the various phases and elements of the proposed framework; the literature reviewed included relevant information security education standards and best practices, as well as key computing curricular guidelines. Secondly, a survey in the form of semi-structured interviews, supported by a questionnaire, was used to elicit computing educators' perspectives on information security education in a South African context, including the perceived challenges and ideas on how to integrate information security into the curricula. Finally, elite interviews were conducted to validate the proposed framework.
It is envisaged that the proposed framework can assist computing departments and undergraduate computing educators in the integration of information security into the curricula, thereby helping to ensure that computing graduates exit higher education institutions possessing the necessary information security skills, knowledge and understanding to perform their roles and responsibilities securely.
Quantum Analogue Computing
We briefly review what a quantum computer is, what it promises to do for us, and why it is so hard to build one. Among the first applications anticipated to bear fruit is quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data is encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data is encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and the error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers (CVQC), becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future. (10 pages, to appear in the Visions 2010 issue of Phil. Trans. Roy. Soc.)
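The precision argument above reduces to simple counting, sketched here; the functions and numbers are illustrative, not from the paper.

```python
import math

# Hedged arithmetic sketch of the encoding argument above.
# Digital (binary) encoding: n bits represent 2**n distinct values, so one
# extra bit of precision costs one extra bit of hardware (linear cost).
def digital_cost(bits_of_precision):
    return bits_of_precision       # hardware grows linearly with precision

# Analogue/direct encoding: the machine must physically resolve every
# distinguishable value, so one extra bit of precision doubles the
# machine's size (exponential cost).
def analogue_cost(bits_of_precision):
    return 2 ** bits_of_precision  # hardware grows exponentially

# Direct quantum simulation maps the simulated Hilbert space onto the
# qubits' Hilbert space: a system of dimension D fits in ceil(log2(D))
# qubits, but refining amplitude precision behaves like the analogue case.
def qubits_needed(dimension):
    return math.ceil(math.log2(dimension))

print(digital_cost(10), analogue_cost(10), qubits_needed(1000))
```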
Findings from a pilot randomised trial of an asthma internet self-management intervention (RAISIN)
Objective: To evaluate the feasibility of a phase 3 randomised controlled trial (RCT) of a website (Living Well with Asthma) to support self-management.

Design and setting: Phase 2, parallel-group RCT; participants recruited from 20 general practices across Glasgow, UK. Randomisation through automated voice response, after baseline data collection, to website access for a minimum of 12 weeks or usual care.

Participants: Adults (age ≥16 years) with physician-diagnosed, symptomatic asthma (Asthma Control Questionnaire (ACQ) score ≥1). People with unstable asthma or other lung disease were excluded.

Intervention: 'Living Well with Asthma' is a desktop/laptop-compatible interactive website designed with input from asthma and behaviour-change specialists, and adults with asthma. It aims to support optimal medication management, promote use of action plans, encourage attendance at asthma reviews and increase physical activity.

Outcome measures: Primary outcomes were recruitment/retention, website use, ACQ and mini-Asthma Quality of Life Questionnaire (AQLQ). Secondary outcomes included patient activation, prescribing, adherence, spirometry, lung inflammation and health service contacts after 12 weeks. Blinding post-randomisation was not possible.

Results: Recruitment target met. 51 participants randomised (25 intervention group). Age range 16–78 years; 75% female; 28% from the most deprived quintile. 45/51 (88%; 20 intervention group) followed up. 19 (76% of the intervention group) used the website, for a mean of 18 min (range 0–49). 17 went beyond the 2 'core' modules. Median number of logins was 1 (IQR 1–2, range 0–7). No significant difference in the prespecified primary efficacy measures of ACQ scores (−0.36; 95% CI −0.96 to 0.23; p=0.225) and mini-AQLQ scores (0.38; 95% CI −0.13 to 0.89; p=0.136). No adverse events.

Conclusions: Recruitment and retention confirmed feasibility; trends towards improved outcomes suggest that use of Living Well with Asthma may improve self-management in adults with asthma and merits further development, followed by investigation in a phase 3 trial.
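A between-group difference with a 95% confidence interval, as reported for the ACQ scores above, is typically computed along these lines. The data below are made up for illustration; they are not the trial's data, and the critical value is an approximation to the relevant t quantile.

```python
import math
from statistics import mean, variance

# Hypothetical end-of-trial scores for two small arms (NOT the trial's data).
intervention = [0.8, 1.2, 0.5, 1.0, 0.7, 1.4, 0.9, 1.1]
usual_care = [1.0, 1.4, 0.6, 1.2, 0.9, 1.5, 1.1, 1.3]

def welch_ci(a, b, t_crit=2.145):
    """Mean difference with a Welch-style 95% CI.
    t_crit = 2.145 approximates the 97.5th percentile of t with ~14 df."""
    diff = mean(a) - mean(b)
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return diff, diff - t_crit * se, diff + t_crit * se

diff, lo, hi = welch_ci(intervention, usual_care)
print(f"difference {diff:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

A confidence interval that crosses zero, as in the trial's ACQ and mini-AQLQ results, corresponds to a non-significant difference at the 5% level.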
Calculation of Elastic Green's Functions for Lattices with Cavities
In this Brief Report, we present an algorithm for calculating the elastic lattice Green's function of a regular lattice in which defects are created by removing lattice points. The method is computationally efficient, since the required matrix operations are on matrices that scale with the size of the defect subspace, and not with the size of the full lattice. This method allows the treatment of force fields with multi-atom interactions. (3 pages, RevTeX using epsfig.sty; one figure)
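The general principle behind such defect-subspace methods can be sketched with a Dyson-equation update: when the defect perturbation dK acts only on a small subspace S, the perturbed Green's function G = (K0 + dK)⁻¹ requires inverting only an |S|×|S| matrix. The 1D harmonic chain and all parameters below are illustrative assumptions, not the Brief Report's lattice or algorithm.

```python
import numpy as np

# Pristine 1D harmonic chain of N sites; a small on-site term makes the
# force-constant matrix K0 invertible.
N = 50
K0 = 2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
K0 += 0.1 * np.eye(N)
G0 = np.linalg.inv(K0)  # pristine Green's function (assumed known/cheap)

# Defect: weaken the two springs around site 25 -> perturbation dK is
# confined to the subspace S = {24, 25, 26}.
S = [24, 25, 26]
dK_SS = np.array([[-0.5, 0.5, 0.0],
                  [0.5, -1.0, 0.5],
                  [0.0, 0.5, -0.5]])

# Dyson update: G = G0 - G0[:, S] (I + dK_SS G0_SS)^(-1) dK_SS G0[S, :],
# so only a |S| x |S| matrix is inverted, not the full N x N lattice.
G0_SS = G0[np.ix_(S, S)]
T = np.linalg.inv(np.eye(len(S)) + dK_SS @ G0_SS) @ dK_SS
G = G0 - G0[:, S] @ T @ G0[S, :]

# Check against brute-force inversion of the full perturbed matrix.
dK = np.zeros((N, N))
dK[np.ix_(S, S)] = dK_SS
G_direct = np.linalg.inv(K0 + dK)
print(np.allclose(G, G_direct))  # True
```

Removing lattice points entirely fits the same pattern, with the perturbation chosen to decouple the removed sites; the cost of the update is set by the defect subspace alone.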